Open Problem: Restricted Eigenvalue Condition for Heavy Tailed Designs

Authors

  • Arindam Banerjee
  • Sheng Chen
  • Vidyashankar Sivakumar
Abstract

The restricted eigenvalue (RE) condition characterizes the sample complexity of accurate recovery in the context of high-dimensional estimators such as the Lasso and the Dantzig selector (Bickel et al., 2009). Recent work has shown that random design matrices drawn from any thin-tailed (subGaussian) distribution satisfy the RE condition with high probability when the number of samples scales as the square of the Gaussian width of the restricted set (Banerjee et al., 2014; Tropp, 2015). We pose the equivalent question for heavy-tailed distributions: Given a random design matrix drawn from a heavy-tailed distribution satisfying the small-ball property (Mendelson, 2015), does the design matrix satisfy the RE condition with the same order of sample complexity as subGaussian distributions? An answer to this question will guide the design of high-dimensional estimators for heavy-tailed problems.
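
For reference, the three ingredients named in the abstract can be written out as follows. These are standard formulations with constants omitted, not text quoted from the paper; the symbols A for the restricted set, w(A) for its Gaussian width, x for a single row of the design X (n samples, p variables), and κ, ε, δ for the relevant constants are notation chosen here.

\[
  \text{RE condition:}\quad \inf_{u \in A}\ \frac{\|Xu\|_2}{\sqrt{n}} \;\ge\; \kappa \;>\; 0,
  \qquad A \subseteq \mathbb{S}^{p-1} \ \text{(restricted set)},
\]
\[
  \text{subGaussian designs:}\quad \text{RE holds with high probability once } n \gtrsim w(A)^2,
  \qquad w(A) \;=\; \mathbb{E}_{g \sim N(0, I_p)}\Big[\sup_{u \in A} \langle g, u\rangle\Big],
\]
\[
  \text{small-ball property:}\quad \Pr\big(|\langle x, u\rangle| \ge \varepsilon \|u\|_2\big) \;\ge\; \delta
  \quad \text{for all } u \in A \ \text{and some fixed } \varepsilon, \delta > 0.
\]

In these terms, the open problem asks whether the small-ball property alone already yields the first display at the same n ≳ w(A)² sample level.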


Similar Resources

Restricted Eigenvalue Properties for Correlated Gaussian Designs

Methods based on l1-relaxation, such as basis pursuit and the Lasso, are very popular for sparse regression in high dimensions. The conditions for success of these methods are now well-understood: (1) exact recovery in the noiseless setting is possible if and only if the design matrix X satisfies the restricted nullspace property, and (2) the squared l2-error of a Lasso estimate decays at the m...
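
As a point of reference for the truncated sentence above, the two conditions are commonly stated as follows (standard formulations with constants omitted, not text from this abstract; θ* denotes the k-sparse target with support S, θ̂ the Lasso estimate, σ the noise level, and X the n × p design):

\[
  \text{restricted nullspace:}\quad \mathrm{null}(X) \,\cap\, \{\Delta \neq 0 : \|\Delta_{S^c}\|_1 \le \|\Delta_S\|_1\} \;=\; \emptyset,
\]
\[
  \text{Lasso error (under RE):}\quad \|\widehat{\theta} - \theta^{*}\|_2^2 \;\lesssim\; \frac{\sigma^2\, k \log p}{n}.
\]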


Optimal Designs for Rational Models and Weighted Polynomial Regression by Holger Dette,

In this paper D-optimal designs for the weighted polynomial regression model of degree p with efficiency function (1 + x^2)^{-n} are presented. Interest in these designs stems from the fact that they are equivalent to locally D-optimal designs for inverse quadratic polynomial models. For the unrestricted design space and p < n, the D-optimal designs put equal masses on p + 1 points which coincide with...
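
For context (a standard definition, not taken from the abstract; the regression functions f(x) = (1, x, ..., x^p)^T and the design measure ξ are notation chosen here), a D-optimal design for weighted polynomial regression with efficiency function λ(x) = (1 + x^2)^{-n} maximizes the determinant of the weighted information matrix:

\[
  M(\xi) \;=\; \int \lambda(x)\, f(x) f(x)^{\mathsf{T}}\, d\xi(x),
  \qquad \xi^{*} \;\in\; \arg\max_{\xi}\ \det M(\xi).
\]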


Some results on the symmetric doubly stochastic inverse eigenvalue problem

The symmetric doubly stochastic inverse eigenvalue problem (hereafter SDIEP) is to determine the necessary and sufficient conditions for an $n$-tuple $\sigma=(1,\lambda_{2},\lambda_{3},\ldots,\lambda_{n})\in\mathbb{R}^{n}$ with $|\lambda_{i}|\leq 1,~i=1,2,\ldots,n$, to be the spectrum of an $n\times n$ symmetric doubly stochastic matrix $A$. If there exists an $n\times n$ symmetric doubly stochastic ...
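
One standard fact worth recalling here (not part of the excerpt): every doubly stochastic matrix has unit row sums, so the all-ones vector is an eigenvector with eigenvalue 1, and stochasticity bounds the spectral radius by 1; this is why the prescribed spectrum is written with leading entry 1 and $|\lambda_i|\leq 1$:

\[
  A\,\mathbf{1} = \mathbf{1}, \qquad \mathbf{1} = (1, 1, \ldots, 1)^{\mathsf{T}}, \qquad \rho(A) \le 1.
\]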


Using Kullback-Leibler distance for performance evaluation of search designs

This paper considers the search problem, introduced by Srivastava \cite{Sr}. This is a model discrimination problem. In the context of search linear models, the discrimination ability of search designs has been studied by several researchers. Some criteria have been developed to measure this capability; however, they are restricted in the sense of being able to work for searching only one possibl...
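
For reference (a standard definition, not drawn from the excerpt), the Kullback-Leibler distance named in the title is, for densities p and q,

\[
  D_{\mathrm{KL}}(p \,\|\, q) \;=\; \int p(x)\, \log \frac{p(x)}{q(x)}\, dx \;\ge\; 0,
\]

with equality if and only if p = q almost everywhere.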


Restricted Eigenvalue from Stable Rank with Applications to Sparse Linear Regression

High-dimensional sparse linear regression is a basic problem in machine learning and statistics. Consider a linear model y = Xθ + w, where y ∈ R^n is the vector of observations, X ∈ R^{n×p} is the covariate matrix with its ith row representing the covariates for the ith observation, and w ∈ R^n is an unknown noise vector. In many applications, the linear regression model is high-dimensional in nature, meanin...
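
Since the excerpt cuts off before defining it, it may help to recall the stable rank named in the title (a standard quantity; the dimension labels n and p used above for observations and covariates are notational choices made here, not quoted from the paper). For a nonzero matrix A with singular values σ_1 ≥ σ_2 ≥ ...,

\[
  \mathrm{srank}(A) \;=\; \frac{\|A\|_F^2}{\|A\|_2^2} \;=\; \frac{\sum_i \sigma_i^2}{\sigma_1^2} \;\le\; \mathrm{rank}(A).
\]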




Publication date: 2015